

Section: New Results

Stochastic Multiple Gradient Descent Algorithm

Participants: Jean-Antoine Désidéri, Quentin Mercier [ONERA DADS Châtillon, doctoral student], Fabrice Poirion [ONERA DADS Châtillon, research engineer].

We have proposed a new method for multiobjective optimization problems in which the objective functions are expressed as expectations of random functions. The method combines the classical stochastic gradient algorithm with a deterministic multiobjective algorithm, the Multiple-Gradient Descent Algorithm (MGDA). In MGDA, a descent direction common to all specified objective functions is identified through a result of convex geometry, namely the minimum-norm element of the convex hull of the objective-function gradients. Injecting this common descent vector, together with the notion of Pareto stationarity, into the stochastic gradient algorithm enables it to solve multiobjective problems. Mean-square and almost-sure convergence of the new algorithm are proven under the classical assumptions of the stochastic gradient method. Its efficiency is illustrated on two academic examples, and its performance is compared to that of the deterministic MGDA coupled with a Monte-Carlo estimator of the expectations. A third example addresses the optimization of a sandwich material under constitutive material uncertainties.
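
As an illustration of the idea (not the authors' implementation), the following Python sketch shows one way such a stochastic MGDA step could be realized: at each iteration the minimum-norm element of the convex hull of the sampled gradients is computed with a generic quadratic-programming solver, and a Robbins-Monro step is taken along its opposite. The function names (common_descent_direction, stochastic_mgda, sample_grads), the solver choice, the step-size schedule, and the toy quadratic objectives are all assumptions made for the example.

import numpy as np
from scipy.optimize import minimize

def common_descent_direction(grads):
    # grads: (m, n) array, one sampled gradient per objective.
    # Returns d = sum_i alpha_i * grads[i] with alpha in the unit simplex,
    # minimizing ||d||; -d is a descent direction common to all objectives
    # and d vanishes at Pareto-stationary points.
    m = grads.shape[0]
    gram = grads @ grads.T
    obj = lambda a: 0.5 * a @ gram @ a
    cons = ({'type': 'eq', 'fun': lambda a: a.sum() - 1.0},)
    bnds = [(0.0, 1.0)] * m
    a0 = np.full(m, 1.0 / m)
    res = minimize(obj, a0, bounds=bnds, constraints=cons, method='SLSQP')
    return res.x @ grads

def stochastic_mgda(sample_grads, x0, steps=2000, lr0=0.5):
    # Hypothetical stochastic MGDA loop: sample_grads(x) returns an
    # (m, n) array of gradients of the m objectives at x for one random
    # sample; step sizes decay as in the classical stochastic gradient
    # method.
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        d = common_descent_direction(sample_grads(x))
        x = x - (lr0 / k) * d
    return x

# Toy usage: two noisy quadratic objectives with minima near c1 and c2;
# in expectation the Pareto set is the segment [c1, c2].
rng = np.random.default_rng(0)
c = np.array([[0.0, 0.0], [1.0, 1.0]])

def sample_grads(x):
    noise = rng.normal(scale=0.1, size=x.shape)
    return np.array([2.0 * (x - ci + noise) for ci in c])

x_star = stochastic_mgda(sample_grads, x0=[3.0, -2.0])

In this sketch the quadratic subproblem is solved with a general-purpose solver for clarity; for two objectives a closed-form expression for the optimal convex combination is available and would be cheaper.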